Meaning: n. a Markov process in which the parameter takes discrete time values.
31. Based on flow-field measurements of the opposed four-burner gasifier, the gasifier was divided into several regions and a Markov chain state-transition diagram was constructed.
32. Markov chain models of cumulative damage have provided a comprehensive description of cumulative damage processes.
33. A theoretical performance analysis of this protocol, based on Markov chain theory with geometrically distributed message lengths, is presented.
34. This paper proves a general property of information entropy increase for finite-state Markov chains.
35. The series are calculated and analyzed by correlation analysis, stochastic simulation, Monte Carlo, and Markov chain Monte Carlo methods, and a group of risk-function models is established.
36. This developed the Markov chain prediction method and widened its range of practical application.
37. In this paper, the main behaviors of PLN (Probabilistic Logic Neuron) networks are investigated quantitatively using Markov chain theory.
38. A traffic load forecasting model based on an optimally combined Markov chain is presented.
39. The Markov chain model and the transition probability matrices of buried pipeline system are introduced.
40. Markov chain: discrete-time Markov chains, classification of states, ergodicity, stationary distributions.
41. The spectral radius is an important global characteristic of an irreducible Markov chain.
42. A sequence of mutually independent trials, as in the generalized Bernoulli scheme, forms a simple homogeneous Markov chain.
43. Using the cumulative probability to estimate the likely stage of earthquake occurrence can supply the origin time for the Markov chain.
44. The Markov chain is suitable for short-term forecasting of large-sample data sequences, while the grey system method is suitable for medium-term forecasting of small-sample data sequences.
45. Then a Markov chain model is constructed based on variable-length patterns to detect abnormal behaviors.
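The idea in sentence 45, scoring observed behavior against a learned transition model, can be sketched with a toy first-order chain. This is a simplification: the sentence describes variable-length patterns, which would use longer histories, and all names and data here are illustrative:

```python
from collections import defaultdict
import math

def train(sequences):
    """Estimate first-order transition probabilities from normal traces."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
            for a, nxt in counts.items()}

def score(model, seq, floor=1e-6):
    """Average negative log-likelihood; higher means more anomalous.
    Unseen transitions get a small floor probability instead of zero."""
    nll = 0.0
    for a, b in zip(seq, seq[1:]):
        nll -= math.log(model.get(a, {}).get(b, floor))
    return nll / max(1, len(seq) - 1)

model = train([list("abababab"), list("ababab")])   # "normal" behavior
normal_score = score(model, list("abab"))
weird_score = score(model, list("aabb"))            # contains unseen moves
```

A trace matching the trained transitions scores near zero, while one containing unseen transitions scores much higher, which is the basis for flagging abnormal behavior.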
46. This paper takes the knowledge base architecture as an example to study the corresponding alignment problem. Finally, a temporally homogeneous Markov chain is used to model the knowle...
47. Secondly, this paper discusses the convergence of the algorithms and introduces convergence analysis via Markov chains.
48. In this paper, the completely ergodic Markov chain with rewards is considered. A definition of reliability in probability form is given, based on the frequency form used in engineering.
49. The work also presents an analytical model using a three-dimensional Markov chain.
50. Given the observed hydrological data, the model can estimate the posterior probability distribution of each change-point location by using the Markov chain Monte Carlo (MCMC) sampling method.
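The MCMC sampling mentioned in sentence 50 can be illustrated with a minimal random-walk Metropolis sampler targeting a standard normal density. This is a generic sketch of the technique, not the hydrological change-point model the sentence describes:

```python
import math
import random

random.seed(0)  # fixed seed so the run is reproducible

def log_target(x):
    """Log-density of N(0, 1), up to an additive constant."""
    return -0.5 * x * x

def metropolis(n_samples, step=1.0):
    """Random-walk Metropolis: a Markov chain whose stationary
    distribution is the target density."""
    x, samples = 0.0, []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)   # symmetric proposal
        # Accept with probability min(1, target(proposal) / target(x)).
        if random.random() < math.exp(min(0.0, log_target(proposal) - log_target(x))):
            x = proposal
        samples.append(x)
    return samples

draws = metropolis(20000)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

After enough iterations the sample mean and variance approach the target's 0 and 1; in a change-point application the target would instead be the posterior over change-point locations.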
51. The model based on the S-ALOHA protocol is analyzed in this paper, and the throughput performance of S-ALOHA is computed by means of a Markov chain and a discrete-time queueing system.
52. The total fatigue damage distribution is analyzed based on a Markov chain of the fatigue damage.